Incremental learning with temporary memory

Authors

  • Sanjay Jain
  • Steffen Lange
  • Samuel E. Moelius
  • Sandra Zilles
Abstract

In the inductive inference framework of learning in the limit, a variation of the bounded example memory (Bem) language learning model is considered. Intuitively, the new model constrains the learner’s memory not only in how much data may be retained, but also in how long that data may be retained. More specifically, the model requires that, if a learner commits an example x to memory in some stage of the learning process, then there is some subsequent stage for which x no longer appears in the learner’s memory. This model is called temporary example memory (Tem) learning. In some sense, it captures the idea that memories fade. Many interesting results concerning the Tem-learning model are presented. For example, there exists a class of languages that can be identified by memorizing k + 1 examples in the Tem sense, but that cannot be identified by memorizing k examples in the Bem sense. On the other hand, there exists a class of languages that can be identified by memorizing just 1 example in the Bem sense, but that cannot be identified by memorizing any number of examples in the Tem sense. (The proof of this latter result involves an infinitary self-reference argument.) Results are also presented concerning the special cases of: learning indexable classes of languages, and learning (arbitrary) classes of infinite languages.
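The "memories fade" constraint can be illustrated with a toy simulation. This is only a sketch under assumed simplifications: the function `tem_learner_run` and its FIFO eviction policy are hypothetical illustrations of the Tem memory discipline (every committed example later leaves memory), not the constructions used in the paper, and a real learning-in-the-limit scenario would involve an infinite text rather than a finite list.

```python
from collections import deque

def tem_learner_run(stream, k):
    """Toy run of a learner whose example memory holds at most k items.

    Illustrative Tem discipline: a FIFO buffer of capacity k, so each
    memorized example is evicted after k further examples arrive --
    i.e., for every stage at which x is committed to memory, there is
    a later stage at which x is no longer in memory (on an unbounded
    stream). Returns a snapshot of the memory after each stage.
    """
    memory = deque(maxlen=k)   # oldest example drops out automatically
    history = []
    for x in stream:
        memory.append(x)
        history.append(list(memory))
    return history

history = tem_learner_run([1, 2, 3, 4, 5], k=2)
# history == [[1], [1, 2], [2, 3], [3, 4], [4, 5]]
```

Under this toy policy, the example committed at stage t is absent from memory at stage t + k, matching the Tem requirement; a Bem learner, by contrast, may keep its k memorized examples forever.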


Similar articles

The effect of reversible inactivation of raphe nuclus on learning and memory in rats

The role of raphe nucleus (R.N) and serotonin in some behaviors such as sleep, cognition, mood, and memory has previously been reported. The median raphe (MR) nucleus is a major serotonin-containing cell group within the brainstem and is one of the main sources of projections to the septum and hippocampus. The hippocampus is widely believed to be essential for context-conditioning learning. Mor...


A Fast Incremental Learning Algorithm of RBF Networks with Long-Term Memory

To avoid catastrophic interference in incremental learning, we have proposed the Resource Allocating Network with Long-Term Memory (RAN-LTM). In RAN-LTM, not only a new training sample but also some memory items stored in Long-Term Memory are trained based on a gradient descent algorithm. In general, gradient descent is slow and can easily fall into local minima. To s...


On the effect of low-quality node observation on learning over incremental adaptive networks

In this paper, we study the impact of a low-quality node on the performance of incremental least mean square (ILMS) adaptive networks. Adaptive networks involve many nodes with adaptation and learning capabilities. A low-quality node in a practical sensor network is modeled by the observation of pure noise (its observation noise), which leads to an unreliable measurement....


FearNet: Brain-Inspired Model for Incremental Learning

Incremental class learning involves sequentially learning classes in bursts of examples from the same class. This violates the assumptions that underlie methods for training standard deep neural networks, and will cause them to suffer from catastrophic forgetting. Arguably, the best method for incremental class learning is iCaRL, but it requires storing training examples for each class, making ...




Journal:
  • Theor. Comput. Sci.

Volume 411, Issue -

Pages -

Published 2010